Bounds on the Generalization Performance of Kernel Machine Ensembles

Authors

  • Theodoros Evgeniou
  • Luis Perez-Breva
Abstract

We study the problem of learning using combinations of machines. In particular, we present new theoretical bounds on the generalization performance of voting ensembles of kernel machines. Special cases considered are bagging and support vector machines. We present experimental results supporting the theoretical bounds, and describe characteristics of kernel machine ensembles suggested by the experimental findings. We also show how such ensembles can be used for fast training with very large datasets.
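To make the idea concrete, below is a minimal sketch of a voting ensemble of kernel machines in the spirit of the abstract: several RBF-kernel SVMs are each trained on a small bootstrap subsample and combined by majority vote, which also illustrates how such ensembles can speed up training on large datasets. The use of scikit-learn's BaggingClassifier and SVC, and all parameter values, are illustrative assumptions, not the authors' implementation.

```python
# Illustrative sketch (not the authors' code): a voting ensemble of kernel
# machines, here bagged RBF-kernel SVMs built with scikit-learn.
from sklearn.datasets import make_classification
from sklearn.ensemble import BaggingClassifier
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

# Synthetic data standing in for a large training set.
X, y = make_classification(n_samples=5000, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Each ensemble member is an SVM trained on a 10% bootstrap subsample,
# so each member trains much faster than a single SVM on the full set;
# member predictions are combined by majority vote.
ensemble = BaggingClassifier(
    SVC(kernel="rbf", C=1.0, gamma="scale"),
    n_estimators=10,
    max_samples=0.1,
    bootstrap=True,
    random_state=0,
)
ensemble.fit(X_train, y_train)
print("ensemble test accuracy:", ensemble.score(X_test, y_test))
```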


Related articles


Learning with kernel machine architectures

This thesis studies the problem of supervised learning using a family of machines, namely kernel learning machines. A number of standard learning methods belong to this family, such as Regularization Networks (RN) and Support Vector Machines (SVM). The thesis presents a theoretical justification of these machines within a unified framework based on the statistical learning theory of Vapnik. The...

Generalization Bounds for Convex Combinations of Kernel Functions

We derive new bounds on covering numbers for hypothesis classes generated by convex combinations of basis functions. These are useful in bounding the generalization performance of algorithms such as RBF-networks, boosting and a new class of linear programming machines similar to SV machines. We show that p-convex combinations with p > 1 lead to diverging bounds, whereas for p = 1 good bounds in...

A Note on the Generalization Performance of Kernel Classifiers with Margin

We present distribution-independent bounds on the generalization misclassification performance of a family of kernel classifiers with margin. Support Vector Machine (SVM) classifiers stem from this class of machines. The bounds are derived through computations of the Vγ dimension of a family of loss functions to which the SVM loss belongs. Bounds that use functions of margin distributions (i.e...

Generalization Performance of Regularization Networks and Support Vector Machines via Entropy Numbers of Compact Operators (produced as part of the ESPRIT Working Group in Neural and Computational Learning II, NeuroCOLT2 27150)

We derive new bounds for the generalization error of kernel machines, such as support vector machines and related regularization networks, by obtaining new bounds on their covering numbers. The proofs make use of a viewpoint that is apparently novel in the field of statistical learning theory. The hypothesis class is described in terms of a linear operator mapping from a possibly infinite dimension...

Journal:

Volume   Issue

Pages  -

Publication date: 1999